60 research outputs found

    The limit empirical spectral distribution of Gaussian monic complex matrix polynomials

    We define the empirical spectral distribution (ESD) of a random matrix polynomial with invertible leading coefficient, and we study it for complex n × n Gaussian monic matrix polynomials of degree k. We obtain exact formulae for the almost sure limit of the ESD in two distinct scenarios: (1) n → ∞ with k constant and (2) k → ∞ with n constant. The main tool for our approach is the replacement principle of Tao, Vu and Krishnapur. Along the way, we also develop some auxiliary results of potential independent interest: we slightly extend a result by Bürgisser and Cucker on the tail bound for the norm of the pseudoinverse of a non-zero-mean matrix, and we obtain several estimates on the singular values of certain structured random matrices. Comment: 25 pages, 4 figures
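    As a concrete illustration of the objects in this abstract, the eigenvalues of a Gaussian monic matrix polynomial can be sampled numerically through the block companion linearization; the ESD is then the uniform measure on the computed eigenvalues. A minimal numpy sketch (the function names are ours, not the paper's):

```python
import numpy as np

def polyeig_monic(A):
    """All n*k eigenvalues of the monic matrix polynomial
    P(z) = I z^k + A[k-1] z^{k-1} + ... + A[0], where each A[i] is n x n,
    computed via the block companion (Frobenius) linearization."""
    k = len(A)
    n = A[0].shape[0]
    C = np.zeros((n * k, n * k), dtype=complex)
    for j in range(k):                    # first block row: -A_{k-1}, ..., -A_0
        C[:n, j * n:(j + 1) * n] = -A[k - 1 - j]
    C[n:, :-n] = np.eye(n * (k - 1))      # subdiagonal identity blocks
    return np.linalg.eigvals(C)

def gaussian_esd_sample(n, k, rng):
    """One draw of the n*k eigenvalues of a complex Gaussian monic matrix
    polynomial; the ESD is the uniform probability measure on these points."""
    A = [(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
         / np.sqrt(2) for _ in range(k)]
    return polyeig_monic(A)

rng = np.random.default_rng(0)
eigs = gaussian_esd_sample(200, 2, rng)   # scenario (1): large n, fixed k
print(eigs.shape)                         # (400,)
```

    Plotting the real and imaginary parts of `eigs` as a scatter gives a histogram-free view of the ESD for one sample.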

    Computing the common zeros of two bivariate functions via Bezout resultants

    The common zeros of two bivariate functions can be computed by finding the common zeros of their polynomial interpolants expressed in a tensor Chebyshev basis. From here we develop a bivariate rootfinding algorithm based on the hidden variable resultant method and Bézout matrices with polynomial entries. Using techniques including domain subdivision, Bézoutian regularization and local refinement, we are able to reliably and accurately compute the simple common zeros of two smooth functions with polynomial interpolants of very high degree (≥ 1000). We analyze the resultant method and its conditioning by noting that the Bézout matrices are matrix polynomials. Our robust algorithm is implemented in the roots command in Chebfun2, a software package written in object-oriented MATLAB for computing with bivariate functions.
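    For intuition, here is the scalar (univariate) analogue of the Bézout matrices used above; in the paper's hidden variable resultant method the entries are polynomials in the hidden variable rather than numbers. A hedged numpy sketch (the function name is illustrative):

```python
import numpy as np

def bezout_matrix(p, q):
    """Bezout matrix of two polynomials given by their monomial coefficients
    p = [p_0, ..., p_n], q = [q_0, ..., q_n] (lowest degree first, padded to
    equal length n+1).  Its entries satisfy
        (p(x) q(y) - p(y) q(x)) / (x - y) = sum_{i,j} B[i, j] x^i y^j,
    and det(B) is, up to sign, the resultant of p and q, so B is singular
    exactly when p and q share a root."""
    n = len(p) - 1
    B = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(min(i, n - 1 - j) + 1):
                B[i, j] += p[j + k + 1] * q[i - k] - q[j + k + 1] * p[i - k]
    return B

# p(x) = x^2 - 1 and q(x) = x - 2 have no common zero: B is nonsingular
B = bezout_matrix([-1.0, 0.0, 1.0], [-2.0, 1.0, 0.0])
print(B)                        # [[ 1. -2.] [-2.  1.]], det = -3

# p(x) = x^2 - 1 and q(x) = x - 1 share the zero x = 1: B is singular
B2 = bezout_matrix([-1.0, 0.0, 1.0], [-1.0, 1.0, 0.0])
print(np.linalg.det(B2))        # 0 up to rounding
```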

    Vector spaces of linearizations for matrix polynomials: A bivariate polynomial approach

    We revisit the important paper [D. S. Mackey, N. Mackey, C. Mehl, and V. Mehrmann, SIAM J. Matrix Anal. Appl., 28 (2006), pp. 971-1004] and, by viewing matrices as coefficients for bivariate polynomials, we provide concise proofs for key properties of linearizations for matrix polynomials. We also show that every pencil in the double ansatz space is intrinsically connected to a Bézout matrix, which we use to prove the eigenvalue exclusion theorem. In addition, our exposition allows for any degree-graded basis, the monomials being a special case. MATLAB code is given to construct the pencils in the double ansatz space for matrix polynomials expressed in any orthogonal basis.
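    The best-known linearization that these ansatz spaces generalize is the first Frobenius companion pencil; the following numpy sketch (our own illustration, not the paper's MATLAB code) builds it for a matrix polynomial in the monomial basis, and extracts eigenvalues under the assumption that the leading coefficient is invertible:

```python
import numpy as np

def companion_pencil(A):
    """First Frobenius companion pencil L(t) = t*X + Y for the matrix
    polynomial P(t) = A[k] t^k + ... + A[1] t + A[0], with n x n
    coefficients A[i]; a classical strong linearization of P."""
    k = len(A) - 1
    n = A[0].shape[0]
    X = np.eye(n * k, dtype=complex)
    X[:n, :n] = A[k]                          # leading coefficient block
    Y = np.zeros((n * k, n * k), dtype=complex)
    for j in range(k):                        # first block row: A_{k-1}, ..., A_0
        Y[:n, j * n:(j + 1) * n] = A[k - 1 - j]
    Y[n:, :-n] = -np.eye(n * (k - 1))         # subdiagonal -I blocks
    return X, Y

# Scalar sanity check: p(t) = t^2 - 3t + 2 = (t - 1)(t - 2)
X, Y = companion_pencil([np.array([[2.0]]), np.array([[-3.0]]),
                         np.array([[1.0]])])
# With X invertible, the pencil eigenvalues are those of -X^{-1} Y
vals = np.linalg.eigvals(np.linalg.solve(X, -Y))
print(np.sort(vals.real))                     # approximately [1., 2.]
```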

    Fiedler-comrade and Fiedler-Chebyshev pencils

    Fiedler pencils are a family of strong linearizations for polynomials expressed in the monomial basis that includes the classical Frobenius companion pencils as special cases. We generalize the definition of a Fiedler pencil from monomials to a larger class of orthogonal polynomial bases. In particular, we derive Fiedler-comrade pencils for two bases that are extremely important in practical applications: the Chebyshev polynomials of the first and second kind. The new approach allows one to construct linearizations having limited bandwidth: a Chebyshev analogue of the pentadiagonal Fiedler pencils in the monomial basis. Moreover, our theory allows for linearizations of square matrix polynomials expressed in the Chebyshev basis (and in other bases), regardless of whether the matrix polynomial is regular or singular, and provides recovery formulas for eigenvectors, minimal indices and minimal bases.
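    In the monomial, scalar, monic case the Fiedler construction can be sketched as a product of elementary factors; every ordering of the factors yields a matrix with the same spectrum. A hedged numpy illustration (not the paper's code, which works in orthogonal bases):

```python
import numpy as np

def fiedler_matrix(a, order=None):
    """Fiedler companion matrix of the monic scalar polynomial
    p(x) = x^k + a[k-1] x^{k-1} + ... + a[0], built as a product of the
    elementary Fiedler factors M_0, ..., M_{k-1}.  Any ordering of the
    factors gives a matrix whose eigenvalues are the roots of p; the
    ordering (k-1, ..., 1, 0) recovers the Frobenius companion matrix,
    while interleaving even and odd indices gives the pentadiagonal form."""
    k = len(a)
    M = []
    for i in range(k):
        Mi = np.eye(k)
        if i == 0:
            Mi[k - 1, k - 1] = -a[0]
        else:
            r = k - i - 1                  # top-left corner of the 2x2 block
            Mi[r:r + 2, r:r + 2] = [[-a[i], 1.0], [1.0, 0.0]]
        M.append(Mi)
    if order is None:
        order = range(k - 1, -1, -1)       # Frobenius ordering
    F = np.eye(k)
    for i in order:
        F = F @ M[i]
    return F

a = [-6.0, 11.0, -6.0]                 # p(x) = (x - 1)(x - 2)(x - 3)
F = fiedler_matrix(a)                  # classical companion matrix
G = fiedler_matrix(a, order=[1, 2, 0]) # a different Fiedler ordering
print(np.sort(np.linalg.eigvals(F).real))   # roots 1, 2, 3
print(np.sort(np.linalg.eigvals(G).real))   # same spectrum
```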

    On the stability of computing polynomial roots via confederate linearizations

    A common way of computing the roots of a polynomial is to find the eigenvalues of a linearization, such as the companion (when the polynomial is expressed in the monomial basis), colleague (Chebyshev basis) or comrade matrix (general orthogonal polynomial basis). For the monomial case, many studies exist on the stability of linearization-based rootfinding algorithms. By contrast, little seems to be known for other polynomial bases. This paper studies the stability of algorithms that compute the roots via linearization in nonmonomial bases, and has three goals. First, we prove normwise stability when the polynomial is properly scaled and the QZ algorithm (as opposed to the more commonly used QR algorithm) is applied to a comrade pencil associated with a Jacobi orthogonal polynomial. Second, we extend a result by Arnold that leads to a first-order expansion of the backward error when the eigenvalues are computed via QR, which shows that the method can be unstable. Based on the analysis, we suggest how to choose between QR and QZ. Finally, we focus on the special case of the Chebyshev basis and finding real roots of a general function on an interval, and discuss how to compute accurate roots. The main message is that, to guarantee backward stability, QZ applied to a properly scaled pencil is necessary.
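    A minimal illustration of the colleague-matrix route in the Chebyshev basis, using numpy's chebyshev module (chebroots forms the colleague matrix of the coefficient vector and computes its eigenvalues with a QR-based solver); this sketches the setting only, not the scaled-QZ variant the paper recommends:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Chebyshev coefficients of p(x) = (x + 0.5)(x - 0.1)(x - 0.7),
# a polynomial with moderate coefficients and real roots in [-1, 1]
c = C.chebfromroots([-0.5, 0.1, 0.7])

# Roots via the eigenvalues of the colleague matrix of c
r = np.sort(C.chebroots(c).real)
print(r)                                  # close to [-0.5, 0.1, 0.7]

# Backward-error proxy: the residual of p at the computed roots is tiny
# when, as here, the coefficients have moderate size
print(np.max(np.abs(C.chebval(r, c))))
```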

    Chebyshev rootfinding via computing eigenvalues of colleague matrices: when is it stable?

    Computing the roots of a scalar polynomial, or the eigenvalues of a matrix polynomial, expressed in the Chebyshev basis {T_k(x)} is a fundamental problem that arises in many applications. In this work, we analyze the backward stability of the polynomial rootfinding problem solved with colleague matrices. In other words, given a scalar polynomial p(x) or a matrix polynomial P(x) expressed in the Chebyshev basis, the question is whether the whole set of computed eigenvalues of the colleague matrix, obtained with a backward stable algorithm such as the QR algorithm, is the set of roots of a nearby polynomial. To do so, we derive a first-order backward error analysis of the polynomial rootfinding algorithm using colleague matrices, adapting the geometric arguments in [A. Edelman and H. Murakami, Polynomial roots from companion matrix eigenvalues, Math. Comp., 64 (1995), no. 210, pp. 763-776] to the Chebyshev basis. We show that, if the absolute values of the coefficients of p(x) (respectively, the norms of the coefficients of P(x)) are bounded by a moderate number, then computing the roots of p(x) (respectively, the eigenvalues of P(x)) via the eigenvalues of its colleague matrix using a backward stable eigenvalue algorithm is backward stable. This backward error analysis also expands on the recent work [Y. Nakatsukasa and V. Noferini, On the stability of computing polynomial roots via confederate linearizations, Math. Comp., 85 (2016), no. 301, pp. 2391-2425], which already showed that this algorithm is not normwise backward stable if the coefficients of p(x) do not have moderate norms.